Neural Network Approximation for Time Splitting Random Functions

Abstract

In this article we present the multivariate approximation of time splitting random functions, defined on a box or on R^N, N ∈ N, by neural network operators of quasi-interpolation type. We achieve these approximations by obtaining quantitative-type Jackson inequalities engaging the modulus of continuity of the related function or of its high-order partial derivatives. We use a density function to define our operators; it derives from the logistic and hyperbolic tangent sigmoid activation functions. Our convergences are both pointwise and uniform. The engaged feed-forward neural networks possess one hidden layer. We finish with a great variety of applications.
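As a rough illustration of the kind of operator involved (a univariate sketch, not the paper's exact multivariate construction), a quasi-interpolation neural network operator built from the logistic sigmoid σ can be written as A_n(f)(x) = Σ_k f(k/n) Φ(nx − k), where Φ(x) = (σ(x+1) − σ(x−1))/2 is the induced density function, which is bell-shaped and satisfies Σ_k Φ(x − k) = 1:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    """Density function induced by the logistic sigmoid:
    a bell-shaped bump satisfying sum_k phi(x - k) = 1."""
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def quasi_interp(f, x, n, k_window=50):
    """Quasi-interpolation operator A_n(f)(x) = sum_k f(k/n) phi(n*x - k).
    The sum is truncated to |k - round(n*x)| <= k_window, which is
    harmless because phi decays exponentially."""
    x = np.asarray(x, dtype=float)
    center = np.rint(n * x).astype(int)
    out = np.zeros_like(x)
    for j in range(-k_window, k_window + 1):
        k = center + j
        out += f(k / n) * phi(n * x - k)
    return out

# Example: approximate f(x) = sin(2*pi*x) on [0, 1] with n = 200
xs = np.linspace(0.0, 1.0, 5)
approx = quasi_interp(lambda t: np.sin(2 * np.pi * t), xs, n=200)
```

The error of such operators is controlled by the modulus of continuity of f at scale roughly 1/n, which is the shape of the Jackson-type inequalities the abstract refers to.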


Similar Articles

The Fractional Differential Polynomial Neural Network for Approximation of Functions

In this work, we introduce a generalization of the differential polynomial neural network utilizing fractional calculus. Fractional calculus is taken in the sense of the Caputo differential operator. It approximates a multi-parametric function with particular polynomials characterizing its functional output as a generalization of input patterns. This method can be employed on data to describe m...


GDOP Classification and Approximation by Implementation of Time Delay Neural Network Method for Low-Cost GPS Receivers

Geometric Dilution of Precision (GDOP) is a coefficient for constellations of Global Positioning System (GPS) satellites. These satellites are organized geometrically. Traditionally, GPS GDOP computation is based on matrix inversion with complicated measurement equations. A new strategy for calculating GPS GDOP is to cast it as a time series problem; it employs machine learning and artif...


Nonlinear Approximation of Random Functions

Given an orthonormal basis and a certain class X of vectors in a Hilbert space H, consider the following nonlinear approximation process: approach a vector x ∈ X by keeping only its N largest coordinates, and let N go to infinity. In this paper, we study the accuracy of this process in the case where H = L²(I), and we use either the trigonometric system or a wavelet basis to expand this space. T...
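The N-term process described above is easy to state concretely: given the coordinate sequence of x in the orthonormal basis, keep the N entries of largest magnitude and discard the rest; by Parseval, the squared error is exactly the energy of the discarded coordinates. A minimal sketch (assuming the expansion coefficients are already computed):

```python
import numpy as np

def n_term_approx(coeffs, N):
    """Nonlinear N-term approximation: zero out all but the
    N largest-magnitude coordinates of an orthonormal expansion."""
    coeffs = np.asarray(coeffs, dtype=float)
    keep = np.argsort(np.abs(coeffs))[-N:]   # indices of the N largest
    out = np.zeros_like(coeffs)
    out[keep] = coeffs[keep]
    return out

# Example: squared error = energy of the discarded coefficients
c = np.array([3.0, -0.1, 2.0, 0.05, -1.0])
approx = n_term_approx(c, N=3)           # keeps 3.0, 2.0, -1.0
err2 = np.sum((c - approx) ** 2)         # 0.1**2 + 0.05**2 = 0.0125
```

The process is "nonlinear" because which coordinates are kept depends on x itself, unlike linear projection onto a fixed N-dimensional subspace.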


Verification of an Evolutionary-based Wavelet Neural Network Model for Nonlinear Function Approximation

Nonlinear function approximation is one of the most important tasks in system analysis and identification. Several models have been presented to achieve an accurate approximation of nonlinear mathematical functions. However, the majority of the models are specific to certain problems and systems. In this paper, an evolutionary-based wavelet neural network model is proposed for structure definiti...


Wavelet Neural Network with Random Wavelet Function Parameters

The training algorithm of Wavelet Neural Networks (WNN) is a bottleneck that impacts the accuracy of the final WNN model. Several methods have been proposed for training WNNs. From the perspective of our research, most of these algorithms are iterative and need to adjust all the parameters of the WNN. This paper proposes a one-step learning method which changes the weights between hidden la...
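The general idea behind such one-step schemes (a sketch under assumed details, not the paper's exact algorithm) is to draw the wavelet translations and dilations at random, keep them fixed, and obtain the output weights in a single linear least-squares solve rather than by iterative adjustment of all parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def mexican_hat(t):
    """Mexican-hat (Ricker) mother wavelet."""
    return (1.0 - t ** 2) * np.exp(-0.5 * t ** 2)

# Training data: a smooth 1-D target function (illustrative choice)
x = np.linspace(-3.0, 3.0, 200)
y = np.sin(x) * np.exp(-0.1 * x ** 2)

# Hidden layer: random translations and dilations, drawn once and fixed
n_hidden = 30
shifts = rng.uniform(-3.0, 3.0, n_hidden)
scales = rng.uniform(0.3, 2.0, n_hidden)
H = mexican_hat((x[:, None] - shifts) / scales)   # shape (200, n_hidden)

# One-step learning: output weights via linear least squares
w, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ w
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
```

Because only the output layer is learned, training reduces to one well-understood linear problem; the trade-off is that accuracy depends on how well the random wavelet parameters happen to cover the input domain.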



Journal

Journal Title: Mathematics

Year: 2023

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math11092183